-
A comprehensive approach to integrated One Health surveillance and response
Surveillance data play a crucial role in understanding and responding to emerging infectious diseases; here, we learn why adopting a One Health surveillance approach to EIDs can help to protect human, animal, and environmental health. Over 75% of emerging infectious diseases (EIDs) affecting humans are zoonotic diseases with animal hosts, which can be transmitted by waterborne, foodborne, vector-borne, or airborne pathways. (7) Early detection is important because it allows a rapid response through preventive and control measures. However, early detection of EIDs is hindered by several obstacles, such as climate change, which can alter habitats and shift the distribution of disease-carrying vectors like mosquitoes and ticks. As a result, diseases such as malaria, dengue fever, and Lyme disease may become more common in areas with established transmission or spread to new areas entirely. (4) Environmental changes such as deforestation and urbanization disrupt ecosystems, increasing the likelihood of zoonotic disease spillover from wildlife to humans. In addition to working at the interface of these changes, detecting and tracking EIDs also requires sharing and standardizing complex data and integrating processes across different regions and health systems.
-
During the COVID-19 pandemic, wastewater surveillance was widely used to monitor temporal and geographical infection trends. Using this as a foundation, a statewide program for routine wastewater monitoring of gastrointestinal pathogens was established in Oklahoma. The results from 18 months of surveillance showed that wastewater concentrations of Salmonella, Campylobacter, and norovirus exhibit seasonal patterns similar to those observed in reported human cases (F = 4–29, p < 0.05) and that wastewater can serve as an early warning tool for increases in cases, offering between one and two weeks of lead time. Approximately one third of outbreak alerts in wastewater correlated in time with confirmed outbreaks of Salmonella or Campylobacter, and our results further indicated that several outbreaks likely go undetected through the traditional surveillance approach currently in place. Better understanding of the true distribution and burden of gastrointestinal infections ultimately facilitates better disease prevention and control and reduces the overall socioeconomic and healthcare-related impact of these pathogens. In this respect, wastewater represents a unique opportunity for monitoring infections in real time, without the need for individual human testing. With increasing demand for sustainable and low-cost disease surveillance, the usefulness of wastewater as a long-term method for tracking infectious disease transmission is likely to become even more pronounced.
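The one-to-two-week lead time reported above is typically estimated by correlating the wastewater signal with reported cases at a series of weekly lags and picking the lag with the strongest correlation. A minimal sketch of that idea, using synthetic weekly series rather than the study's data (the function name, the lag range, and the toy series are illustrative assumptions):

```python
import numpy as np

def lead_time_weeks(wastewater, cases, max_lag=4):
    """Find the lag (in weeks) at which the wastewater signal best
    correlates with reported cases. A positive lag means wastewater
    leads cases by that many weeks."""
    best_lag, best_r = 0, -np.inf
    for lag in range(0, max_lag + 1):
        if lag == 0:
            w, c = wastewater, cases
        else:
            w, c = wastewater[:-lag], cases[lag:]
        r = np.corrcoef(w, c)[0, 1]  # Pearson correlation at this lag
        if r > best_r:
            best_lag, best_r = lag, r
    return best_lag, best_r

# Synthetic weekly series: cases trail the wastewater signal by 2 weeks.
rng = np.random.default_rng(0)
signal = np.sin(np.linspace(0, 6 * np.pi, 80)) + 2
wastewater = signal + rng.normal(0, 0.05, 80)
cases = np.roll(signal, 2) + rng.normal(0, 0.05, 80)

lag, r = lead_time_weeks(wastewater, cases)  # recovers the 2-week lead
```

In practice the series would first be smoothed and deseasonalized, and significance of the lagged correlation would be assessed before reporting a lead time.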
-
Elkins, Christopher A (Ed.)
ABSTRACT Wastewater-based epidemiology (WBE) expanded rapidly in response to the COVID-19 pandemic. As the public health emergency has ended, researchers and practitioners are looking to shift the focus of existing wastewater surveillance programs to other targets, including bacteria. Bacterial targets may pose some unique challenges for WBE applications. To explore the current state of the field, the National Science Foundation-funded Research Coordination Network (RCN) on Wastewater Based Epidemiology for SARS-CoV-2 and Emerging Public Health Threats held a workshop in April 2023 to discuss the challenges and needs for wastewater bacterial surveillance. The targets and methods used in existing programs were diverse, with twelve different targets and nine different methods listed. Discussions during the workshop highlighted the challenges in adapting existing programs and identified research gaps in four key areas: choosing new targets, relating bacterial wastewater data to human disease incidence and prevalence, developing methods, and normalizing results. To help address these challenges and research gaps, the authors identified steps the larger community can take to improve bacterial wastewater surveillance. These include developing data reporting standards and optimizing and validating methods for bacterial programs. Additionally, more work is needed to understand shedding patterns for potential bacterial targets to better relate wastewater data to human infections. Wastewater surveillance for bacteria can help provide insight into the underlying prevalence of infections in communities, but much work is needed to establish these methods.
IMPORTANCE Wastewater surveillance was a useful tool for elucidating the burden and spread of SARS-CoV-2 during the pandemic. Public health officials and researchers are interested in expanding these surveillance programs to include bacterial targets, but many questions remain. The NSF-funded Research Coordination Network for Wastewater Surveillance of SARS-CoV-2 and Emerging Public Health Threats held a workshop to identify barriers and research gaps to implementing bacterial wastewater surveillance programs.
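One of the research gaps named above, normalizing results, often comes down to converting raw concentrations into population-normalized daily loads so that sewersheds of different sizes can be compared. A minimal sketch of that conversion (the function name and the example numbers are illustrative assumptions, not values from the workshop):

```python
def per_capita_load(conc_gc_per_l, flow_l_per_day, population):
    """Convert a measured concentration (gene copies per liter) into a
    population-normalized daily load (gene copies per person per day):
    load = concentration x daily flow / population served."""
    if population <= 0:
        raise ValueError("population must be positive")
    return conc_gc_per_l * flow_l_per_day / population

# Example: 1e5 gc/L measured at a plant treating 40 million L/day
# for a sewershed of 100,000 people.
load = per_capita_load(1e5, 4e7, 1e5)
```

Other normalization schemes discussed in the WBE literature divide by a fecal-strength indicator measured in the same sample instead of plant flow; the choice is one of the open methodological questions the workshop flagged.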
-
Abstract Fire frequency is increasing with climate warming in the boreal regions of interior Alaska, with short fire return intervals (< 50 years) becoming more common. Recent studies suggest these “reburns” will reduce the insulating surface organic layer (SOL) and seedbanks, inhibiting black spruce regeneration and increasing deciduous cover. These changes are projected to amplify soil warming, increasing mineral soil organic carbon (SOC) decomposition rates, and to impair re-establishment of understorey vegetation and the SOL. We examined how reburns changed soil temperature, heterotrophic soil respiration (RH), and understorey gross primary production (GPP), and related these to shifts in vegetation composition and SOL depths. Two distinct burn complexes previously covered by spruce were measured; both included areas burned 1x, 2x, and 3x over 60 years and mature (≈ 90-year-old) spruce forests underlain by permafrost. A 2.7 °C increase in annual near-surface soil temperatures from 1x to 3x burns was correlated with a decrease in SOL depths and a 1.9 Mg C ha⁻¹ increase in annual RH efflux. However, near-surface soil warming accounted for ≤ 23% of the higher RH efflux; increases in deciduous overstorey vegetation and root biomass with reburning correlated better with RH than soil temperature did. Reburning also warmed deeper soils and reduced the biomass and GPP of understorey plants, lessening their potential to offset elevated RH and contribute to SOL development. This suggests that reburning led to losses of mineral SOC previously stored in permafrost due to warming soils and changes in vegetation composition, illustrating how burn frequency creates pathways for accelerated regional C loss.
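The finding that near-surface warming explains only part of the higher RH efflux can be sanity-checked with a simple temperature-sensitivity calculation. Under an assumed Q10 of 2 (a commonly used value, not one fitted by this study), the observed 2.7 °C warming alone predicts only about a 21% increase in respiration, consistent with the ≤ 23% figure above:

```python
def q10_respiration_ratio(delta_t_c, q10=2.0):
    """Multiplicative change in respiration expected for a temperature
    increase of delta_t_c degrees C under a Q10 model:
    ratio = q10 ** (delta_t_c / 10)."""
    return q10 ** (delta_t_c / 10.0)

ratio = q10_respiration_ratio(2.7)  # about 1.21, i.e. a ~21% increase
```

Any RH increase beyond that ratio must come from non-temperature drivers, which is why the abstract points to deciduous vegetation and root biomass.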
-
Tree plantations represent an important component of the global carbon (C) cycle and are expected to increase in prevalence during the 21st century. We examined how silvicultural approaches that optimize economic returns in loblolly pine (Pinus taeda L.) plantations affected the accumulation of C in pools of vegetation, detritus, and mineral soil to a depth of 100 cm across the loblolly pine's natural range in the southeastern United States. Comparisons of silvicultural treatments included competing vegetation ('weed') control, fertilization, thinning, and varying intensities of silvicultural treatment for 106 experimental plantations and 322 plots. The average age of the sampled plantations was 17 years; the C stored in vegetation (pine and understory) averaged 82.1 ± 3.0 (± std. error) Mg C ha⁻¹, with 14.3 ± 0.6 Mg C ha⁻¹ in detrital pools (soil organic layers, coarse woody debris, and soil detritus). Mineral soil C (0–100 cm) averaged 79.8 ± 4.6 Mg C ha⁻¹ across sites. Among management effects, thinning reduced vegetation C by 35.5 ± 1.2 Mg C ha⁻¹ across all treatment combinations. Weed control and fertilization increased vegetation C by between 2.3 and 5.7 Mg C ha⁻¹ across treatment combinations, with high-intensity silvicultural applications producing greater vegetation C than low-intensity ones (an increase of 21.4 ± 1.7 Mg C ha⁻¹). Detrital C pools were negatively affected by thinning where either fertilization or weed control was also applied, and increased with management intensity. Mineral soil C did not respond to any silvicultural treatment. From these data, we constructed regression models that summarized C accumulation in detritus and in detritus + vegetation as functions of independent variables commonly monitored by plantation managers: site index (SI), trees per hectare (TPH), and plantation age (AGE). The C stored in detritus and vegetation increased on average with AGE, and both models included SI and TPH. The detritus model explained less variance (adj. R² = 0.29) than the detritus + vegetation model (adj. R² = 0.87). A general recommendation for managers looking to maximize C storage would be to maintain a high TPH and increase SI, with SI manipulation having the greater relative effect. From the model, we predict that a plantation managed to achieve the average upper-third SI (26.8) within our observations, and planted at 1500 TPH, could accumulate ~85 Mg C ha⁻¹ in detritus and vegetation by 12 years of age, an amount greater than the region's average mineral soil C pool. Notably, SI can be increased using both genetic and silvicultural technologies.
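The regression described above, C accumulation as a function of SI, TPH, and AGE, can be sketched as an ordinary least-squares fit. The plot data and generating coefficients below are synthetic placeholders chosen only to mirror the model's form; the paper's fitted coefficients are not reported in this abstract:

```python
import numpy as np

# Hypothetical plot-level data: site index, trees/ha, age (yr), C (Mg/ha).
# All values are illustrative, not the study's measurements.
rng = np.random.default_rng(1)
n = 120
SI = rng.uniform(18, 28, n)
TPH = rng.uniform(500, 1800, n)
AGE = rng.uniform(5, 30, n)
# Assumed linear response used only to generate the toy data:
C = 2.5 * AGE + 1.8 * SI + 0.01 * TPH - 20 + rng.normal(0, 5, n)

# Ordinary least squares: C ~ intercept + SI + TPH + AGE
X = np.column_stack([np.ones(n), SI, TPH, AGE])
coef, *_ = np.linalg.lstsq(X, C, rcond=None)

def predict_c(si, tph, age, b=coef):
    """Predict detritus + vegetation C (Mg/ha) from the fitted model."""
    return b[0] + b[1] * si + b[2] * tph + b[3] * age

# Prediction at the abstract's scenario: SI = 26.8, 1500 TPH, age 12.
pred = predict_c(26.8, 1500, 12)
```

With the study's actual coefficients in place of the toy ones, this is the calculation behind the ~85 Mg C ha⁻¹ projection quoted in the abstract.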